[BUZZOK-28206] Fix broken requirements.txt in GenAI Agents environment (Need to remove all ; things in requirements.txt)
#1724
Conversation
ptyprocess==0.7.0 ; os_name != 'nt' or (sys_platform != 'emscripten' and sys_platform != 'win32')
pure-eval==0.2.3
py-rust-stemmers==0.1.5
protobuf==5.29.4
Risk: Affected versions of protobuf are vulnerable to Uncontrolled Recursion. The pure-Python implementation of Protocol Buffers is vulnerable to a denial-of-service attack when processing untrusted data with deeply nested or recursive groups/messages, potentially causing the Python recursion limit to be exceeded.
Manual Review Advice: A vulnerability from this advisory is reachable if you have set up the Protobuf pure-Python backend (the other backends are safe).
Fix: Upgrade this library to at least version 5.29.5 at datarobot-user-models/public_dropin_environments/python311_genai_agents/requirements.txt:83.
Reference(s): GHSA-8qvm-5x2c-j2w7, CVE-2025-4565
🍰 Fixed in commit 6d30203 🍰
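For reference, the remediation is a one-line version bump in requirements.txt. A minimal sketch of the change (the actual fix landed in commit 6d30203; the exact pin chosen there may differ):

    -protobuf==5.29.4
    +protobuf==5.29.5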
The Needs Review labels were added based on the following file changes. Team @datarobot/buzok (#genai) was assigned because of changes in these files: public_dropin_environments/python311_genai_agents/env_info.json and public_dropin_environments/python311_genai_agents/requirements.txt. If you think there are issues with ownership, please discuss with the C&A domain in the #sdtk Slack channel and create a PR to update the DRCODEOWNERS\CODEOWNERS file.
jpclemens0 left a comment
Duplicated packages
nvidia-nat-crewai==1.3.0rc3 ; python_full_version >= '3.11'
nvidia-nat-langchain==1.3.0rc3 ; python_full_version >= '3.11'
nvidia-nat-opentelemetry==1.3.0rc3 ; python_full_version >= '3.11'
numpy==1.26.4
Two numpy?
Hmmm... Yeah, maybe this is a problem. The requirements.txt here is technically fake; it only exists in the Python 3.11 image. The issue is that it's hard to create a drum-compatible requirements.txt from a uv.lock file. The two formats don't really coexist very easily.
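For context, the export being discussed looks roughly like the following. This is a sketch assuming uv's standard export command; the exact flags used in this repo are an assumption:

    uv export --format requirements-txt --no-hashes --no-dev -o requirements.txt

The exported file keeps the per-platform and per-Python environment markers (the "; ..." suffixes) from uv.lock, which is what the drum installer appears to reject.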
I would go with just Python 3.11 then, if we are just generating a placeholder example of requirements.txt.
@jpclemens0 So I reworked this so that the pyproject.toml exported to datarobot-user-models from the af-component is ONLY a Python 3.11 build now. We could look at upgrading this later, but you made a really good point here.
We really only need the uv.lock information, requirements.txt, and a pyproject.toml for Python 3.11, since that is the base environment installed. If a user is rebuilding the Docker image, they are still using the core environment setup, and we currently only support a 3.11-based Chainguard build pipeline.
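A minimal sketch of what pinning the exported pyproject.toml to a single interpreter could look like (requires-python is the standard field; the exact constraint used here is an assumption):

    [project]
    requires-python = "==3.11.*"

With only one supported interpreter, the exported requirements no longer need python_full_version markers or duplicate pins like the two numpy entries above.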
nvidia-nat-opentelemetry==1.3.0rc3 ; python_full_version >= '3.11'
numpy==1.26.4
numpy==2.3.4
nvidia-nat==1.3.0rc3
nat requires python >= 3.11
Can you bump this to 1.3.0, which was released on Friday?
s3transfer==0.13.1
scipy==1.15.3 ; python_full_version < '3.11'
scipy==1.16.2 ; python_full_version >= '3.11'
scipy==1.15.3
Two scipy?
The removals here are all the non-Python 3.11 stuff being yanked out.
jpclemens0 left a comment
LGTM with an optional bump for NAT
@jpclemens0 Sure, I can bump it to 1.3.0.
Force-pushed from f936300 to 502a878.
Changing this to a uv pip compile flow to minimize the size of requirements.txt since we're over the limit with proper generation.
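A sketch of that flow, assuming a standard uv pip compile invocation (the exact flags in the actual pipeline are an assumption):

    uv pip compile pyproject.toml -o requirements.txt --python-version 3.11 --no-annotate --no-header

Compiling for a single Python version avoids the environment markers, and dropping the annotations and header keeps requirements.txt smaller.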
This repository is public. Do not put any private DataRobot or customer data here: code, datasets, model artifacts, etc.
Summary
The previous format seems to be incompatible with the drum installer. We don't actually use this file directly, but we need it for the record.
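As an illustration of the incompatibility, an entry exported from uv.lock carries an environment marker, while the cleaned file pins a single version with no marker. The before line is taken from the diff quoted earlier in the thread; the after line is illustrative:

    Before: ptyprocess==0.7.0 ; os_name != 'nt' or (sys_platform != 'emscripten' and sys_platform != 'win32')
    After:  ptyprocess==0.7.0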